This is an example from Fermat's Library.
On average, if you select numbers uniformly at random from $[0, 1]$, how many will it take for the sum to exceed $1$?
The answer is $\large{e}$ $\approx 2.718 \dots$
The simulation is straightforward and the proof is quite clever.
function meancount(N)
    t = 0                      # total number of draws across all N trials
    for i in 1:N
        s = 0.0                # running sum for this trial
        while s <= 1.0         # keep drawing until the sum exceeds 1
            s += rand()
            t += 1
        end
    end
    return t / N               # average number of draws per trial
end;
Julia is very fast at this kind of simulation, so we can run a large number of trials quickly. On my machine, 100 million simulations take less than 5 seconds.
@time meancount(100_000_000)
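The trials are independent, so the loop also parallelizes trivially. Below is a minimal threaded sketch (my own addition, not part of the original; it assumes Julia was started with multiple threads, e.g. `julia -t auto`, and the name `meancount_threaded` is hypothetical). The per-trial `atomic_add!` keeps the accumulation simple and race-free, at the cost of some contention.
using Base.Threads: @threads, Atomic, atomic_add!

function meancount_threaded(N)
    total = Atomic{Int}(0)      # shared draw counter, updated atomically
    @threads for i in 1:N
        s = 0.0                 # running sum for this trial
        c = 0                   # draws used in this trial
        while s <= 1.0
            s += rand()
            c += 1
        end
        atomic_add!(total, c)   # one atomic update per trial
    end
    return total[] / N
end;

@time meancount_threaded(100_000_000)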
Some notation:
- $N_x$ is the number of values selected from $[0,1]$ that you need for the sum to exceed $x$.
- $m_x = \mathbb{E}(N_x)$, i.e. $m_x$ is the expected number of selections needed for the sum to exceed $x$.
- $s$ is the first value selected (from $[0,1]$).
We want $m_1$.
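Before the proof, here is a quick empirical check of the notation: a small variant of `meancount` that estimates $m_x$ for an arbitrary threshold $x$ (the function name `meancount_x` is mine, not from the original).
function meancount_x(x, N)
    t = 0                      # total number of draws across all N trials
    for i in 1:N
        s = 0.0
        while s <= x           # draw until the running sum exceeds x
            s += rand()
            t += 1
        end
    end
    return t / N               # Monte Carlo estimate of m_x
end;
For example, `meancount_x(1.0, 10_000_000)` should match `meancount(10_000_000)` up to Monte Carlo noise, and `meancount_x(0.5, 10_000_000)` should come out close to $e^{0.5} \approx 1.65$, in line with the $m_x = e^x$ claim verified below.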
Now
$$
  \begin{cases}
    \text{if } s > 1: & N_1 = 1 \quad \text{(but impossible, since } s \in [0,1]\text{)} \\[0.8em]
    \text{if } s \le 1: & N_1 = 1 + N_{1-s}
  \end{cases}
$$
Then, since $s$ is uniform on $[0,1]$, taking the expectation over $s$ gives

$$
m_1 = \int_0^1 \left(1 + m_{1-s}\right) ds = 1 + \int_0^1 m_u \, du.
$$

The same conditioning works with $1$ replaced by any $x \in [0,1]$ (the case $s > x$ can then actually occur, contributing $N_x = 1$), which gives

$$
m_x = 1 + \int_0^x m_u \, du.
$$
And then it's easy to verify that $m_x = e\,^x$ is a solution: